Variational Structured Stochastic Network

Authors

  • Hao Liu
  • Xinyi Yang
  • Zenglin Xu
Abstract

High-dimensional sequential data exhibits complex structure, so a successful generative model for such data must involve highly dependent, structured variables. It is therefore desirable, or even necessary, to model the correlations and dependencies among the multiple input, output, and latent variables in such scenarios. To achieve this goal, we introduce the Variational Structured Stochastic Network (VSSN), a new method for modeling high-dimensional structured data. Leveraging recent advances in Stochastic Gradient Variational Bayes, VSSN can handle intractable inference distributions via stochastic variational inference (Hoffman et al., 2013; Ranganath et al., 2014). To evaluate the proposed model, we apply it to speech recording data, music data, and several dynamic image-sequence modeling tasks. Experimental results demonstrate that our proposed method outperforms most state-of-the-art methods.
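The abstract does not spell out the VSSN training objective, but the Stochastic Gradient Variational Bayes machinery it builds on can be sketched as a single-sample, reparameterized ELBO estimate for a Gaussian posterior against a standard-normal prior. This is a minimal illustration only; the function names and the toy identity decoder below are assumptions, not details from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

def elbo_estimate(x, mu, log_sigma, decode, log_likelihood):
    """Single-sample SGVB estimate of the ELBO for one data point.

    q(z|x) = N(mu, diag(sigma^2)) is sampled via the reparameterization
    trick; the KL term against a standard-normal prior is analytic.
    """
    sigma = np.exp(log_sigma)
    eps = rng.standard_normal(mu.shape)
    z = mu + sigma * eps                       # reparameterized sample
    recon = log_likelihood(x, decode(z))       # 1-sample E_q[log p(x|z)]
    kl = 0.5 * np.sum(mu**2 + sigma**2 - 1.0 - 2.0 * log_sigma)
    return recon - kl

# Toy usage: identity "decoder" with a Gaussian log-likelihood
# (illustrative only, not the architecture used in the paper).
x = np.array([0.5, -1.0])
decode = lambda z: z
loglik = lambda x, xhat: -0.5 * np.sum((x - xhat) ** 2)
print(elbo_estimate(x, mu=np.zeros(2), log_sigma=np.zeros(2),
                    decode=decode, log_likelihood=loglik))
```

Because the sampling path is differentiable in `mu` and `log_sigma`, the same estimate yields low-variance gradients for stochastic optimization, which is what makes inference tractable for structured models like VSSN.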


Similar References

Bayesian Sparsity for Intractable Distributions

Bayesian approaches for single-variable and group-structured sparsity outperform L1 regularization, but are challenging to apply to large, potentially intractable models. Here we show how noncentered parameterizations, a common trick for improving the efficiency of exact inference in hierarchical models, can similarly improve the accuracy of variational approximations. We develop this with two ...


Two Methods for Wild Variational Inference

Variational inference provides a powerful tool for approximate probabilistic inference on complex, structured models. Typical variational inference methods, however, require inference networks with computationally tractable probability density functions. This largely limits the design and implementation of variational inference methods. We consider wild variational inference methods that...


Learning Stochastic Recurrent Networks

Leveraging advances in variational inference, we propose to enhance recurrent neural networks with latent variables, resulting in Stochastic Recurrent Networks (STORNs). The model i) can be trained with stochastic gradient methods, ii) allows structured and multi-modal conditionals at each time step, iii) features a reliable estimator of the marginal likelihood and iv) is a generalisation of de...


Scaling Factorial Hidden Markov Models: Stochastic Variational Inference without Messages

Factorial Hidden Markov Models (FHMMs) are powerful models for sequential data, but they do not scale well with long sequences. We propose a scalable inference and learning algorithm for FHMMs that draws on ideas from the stochastic variational inference, neural network, and copula literatures. Unlike existing approaches, the proposed algorithm requires no message passing procedure among latent v...


Monte Carlo Structured SVI for Non-Conjugate Models

The stochastic variational inference (SVI) paradigm, which combines variational inference, natural gradients, and stochastic updates, was recently proposed for large-scale data analysis in conjugate Bayesian models and demonstrated to be effective in several problems. This paper studies a family of Bayesian latent variable models with two levels of hidden variables but without any conjugacy req...



Journal title:

Volume   Issue

Pages  -

Publication year: 2017